A Derivative-free Method for Linearly Constrained Nonsmooth Optimization

Authors

  • Adil M. Bagirov
  • Moumita Ghosh
  • Dean Webb
  • Duan Li
Abstract

This paper develops a new derivative-free method for solving linearly constrained nonsmooth optimization problems. The objective functions in these problems are, in general, non-regular locally Lipschitz continuous functions, and computing generalized subgradients of such functions is a difficult task. We propose an algorithm, based on the notion of a discrete gradient, for computing subgradients of a broad class of non-regular locally Lipschitz continuous functions. Building on discrete gradients, we then develop an algorithm for solving linearly constrained nonsmooth optimization problems. We report preliminary results of numerical experiments, which demonstrate that the proposed algorithm is efficient for solving linearly constrained nonsmooth optimization problems.
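The discrete gradient is, roughly speaking, a finite-difference approximation of a subgradient that avoids evaluating derivatives of the nonsmooth objective. A minimal sketch of this idea in Python follows; the function name, the step size `h`, and the use of a perturbation direction `g` are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def discrete_gradient(f, x, g, h=1e-6):
    """Approximate a subgradient of a nonsmooth function f at x.

    Simplified sketch: perturb x a small step along direction g (to
    move off a kink where f may be nondifferentiable), then take
    coordinate-wise finite differences at the perturbed point.
    """
    x = np.asarray(x, dtype=float)
    g = np.asarray(g, dtype=float)
    n = x.size
    z = x + h * g          # step off the potential kink
    fz = f(z)
    v = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        v[i] = (f(z + e) - fz) / h   # forward difference in coordinate i
    return v

# Example: f(x) = |x1| + |x2| is nonsmooth at the origin.
f = lambda x: abs(x[0]) + abs(x[1])
v = discrete_gradient(f, [0.0, 0.0], [1.0, 1.0])
# v approximates the subgradient (1, 1) selected by the direction g
```

Stepping along `g` before differencing is what lets the scheme pick out a particular subgradient at a kink; a plain finite difference taken exactly at a nondifferentiable point would be direction-dependent and ill-defined.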


Related articles

A Derivative-Free Algorithm for Inequality Constrained Nonlinear Programming via Smoothing of an $\ell_\infty$ Penalty Function

In this paper we consider inequality constrained nonlinear optimization problems where the first-order derivatives of the objective function and the constraints cannot be used. Our starting point is the possibility of transforming the original constrained problem into an unconstrained or linearly constrained minimization of a nonsmooth exact penalty function. This approach shows two main difficult...


A progressive barrier derivative-free trust-region algorithm for constrained optimization

We study derivative-free constrained optimization problems and propose a trust-region method that builds linear or quadratic models around the best feasible and around the best infeasible solutions found so far. These models are optimized within a trust region, and the progressive barrier methodology handles the constraints by progressively pushing the infeasible solutions toward the feasib...


A Linesearch-Based Derivative-Free Approach for Nonsmooth Constrained Optimization

In this paper, we propose new linesearch-based methods for nonsmooth constrained optimization problems when first-order information on the problem functions is not available. In the first part, we describe a general framework for bound-constrained problems and analyze its convergence towards stationary points, using the Clarke-Jahn directional derivative. In the second part, we consider inequal...


An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...


A derivative-free algorithm for nonlinear programming

In this paper we consider nonlinear constrained optimization problems in the case where the first-order derivatives of the objective function and the constraints cannot be used. To date, only a few approaches have been proposed for tackling such a class of problems. In this work we propose a new algorithm. The starting point of the proposed approach is the possibility to transform the original c...



Journal title:

Volume   Issue

Pages  -

Publication date: 2006